104 research outputs found
The foundational legacy of ASL
Abstract. We recall the kernel algebraic specification language ASL and outline its main features in the context of the state of research on algebraic specification at the time it was conceived in the early 1980s. We discuss the most significant new ideas in ASL and the influence they had on subsequent developments in the field and on our own work in particular.
Maximum-entropy theory of steady-state quantum transport
We develop a theoretical framework for describing steady-state quantum transport phenomena, based on the general maximum-entropy principle of nonequilibrium statistical mechanics. The general form of the many-body density matrix is derived, which contains the invariant part of the current operator that guarantees the nonequilibrium and steady-state character of the ensemble. Several examples of the theory are given, demonstrating the relationship of the present treatment to the widely used scattering-state occupation schemes at the level of the self-consistent single-particle approximation. The latter schemes are shown not to maximize the entropy, except in certain limits.
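As a hedged sketch of the construction the abstract describes (the notation below is assumed for illustration and is not taken from the paper), maximizing the entropy subject to constraints on energy, particle number, and the average current yields a density matrix of the generic form:

```latex
% Maximum-entropy steady-state density matrix (illustrative notation):
% \hat{J} denotes the invariant part of the current operator,
% \lambda the Lagrange multiplier fixing its expectation value.
\rho \;=\; \frac{1}{Z}\,
  \exp\!\bigl[-\beta\bigl(\hat{H} - \mu \hat{N}\bigr) + \lambda \hat{J}\bigr],
\qquad
Z \;=\; \operatorname{Tr}
  \exp\!\bigl[-\beta\bigl(\hat{H} - \mu \hat{N}\bigr) + \lambda \hat{J}\bigr].
```

Using the invariant (conserved) part of the current operator, with $[\hat{H}, \hat{J}] = 0$, is what makes $\rho$ stationary under the dynamics, i.e. a genuine steady-state ensemble.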
Symbolic and analytic techniques for resource analysis of Java bytecode
Recent work in resource analysis has translated the idea of amortised resource analysis to imperative languages using a program logic that allows mixing of assertions about heap shapes, in the tradition of separation logic, and assertions about consumable resources. Separately, polyhedral methods have been used to calculate bounds on numbers of iterations in loop-based programs. We are attempting to combine these ideas to deal with Java programs involving both data structures and loops, focusing on the bytecode level rather than on source code.
A Kernel Specification Formalism with Higher-Order Parameterisation
A specification formalism with parameterisation of an arbitrary order is presented. It is given a denotational-style semantics, accompanied by an inference system for proving that an object satisfies a specification. The inference system incorporates, but is not limited to, a clearly identified type-checking component. Special effort is made to carefully distinguish between parameterised specifications, which denote functions yielding classes of objects, and specifications of parameterised objects, which denote classes of functions yielding objects. To deal with both of these in a uniform framework, it was convenient to view specifications, which specify objects, as objects themselves, and to introduce a notion of a specification of specifications. The formalism includes the basic specification-building operations of the ASL specification language. This choice, however, is orthogonal to the new ideas presented. The formalism is also institution-independent, although this iss..
Semantics, Implementation and Pragmatics of Clear, a Program Specification Language
Specifications are necessary for communicating decisions and
intentions and for documenting results at many stages of the program
development process. Informal specifications are typically used
today, but they are imprecise and often ambiguous. Formal
specifications are precise and exact but are more difficult to write
and understand. We present work aimed toward enabling the practical
use of formal specifications in program development, concentrating
on the Clear language for structured algebraic specification.
Two different but equivalent denotational semantics for Clear are
given. One is a version of a semantics due to Burstall and Goguen
with a few corrections, in which the category-theoretic notion of a
colimit is used to define Clear's structuring operations
independently of the underlying 'institution' (logical formalism).
The other semantics defines the same operations by means of
straightforward set-theoretic constructions; it is not institution-independent
but it can be modified to handle all institutions of
apparent interest.
Both versions of the semantics have been implemented. The set-theoretic
implementation is by far the more useful of the two, and
includes a parser and typechecker. An implementation is useful for
detecting syntax and type errors in specifications, and can be used
as a front end for systems which manipulate specifications. Several
large specifications which have been processed by the set-theoretic
implementation are presented.
A semi-automatic theorem prover for Clear built on top of the
Edinburgh LCF system is described. It takes advantage of the
structure of Clear specifications to restrict the available
information to that which seems relevant to proving the theorem at
hand. If the system is unable to prove a theorem automatically the
user can attempt the proof interactively using the high-level
primitives and inference rules provided.
We lay a theoretical foundation for the use of Clear in
systematic program development by investigating a new notion of the
implementation of a specification by a lower-level specification.
This notion extends to handle parameterised specifications. We show
that this implementation relation is transitive and commutes with
Clear's structuring operations under certain conditions. This means
that a large specification can be refined to a program in a gradual
and modular fashion, where the correctness of the individual
refinements guarantees the correctness of the resulting program.
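The compositionality claim in the last paragraph can be sketched as follows (the relation symbol and the names SP, op are assumed here for illustration; they are not the thesis's own notation):

```latex
% Transitivity of the implementation relation:
SP_1 \rightsquigarrow SP_2 \;\wedge\; SP_2 \rightsquigarrow SP_3
  \;\Longrightarrow\; SP_1 \rightsquigarrow SP_3
% Compatibility with a structuring operation op,
% under certain conditions on op:
\qquad
SP \rightsquigarrow SP'
  \;\Longrightarrow\; op(SP) \rightsquigarrow op(SP')
```

Together these let a refinement of one component of a structured specification be carried out and verified in isolation, which is what makes gradual, modular development sound.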
A Key to Your Heart: Biometric Authentication Based on ECG Signals
In recent years, there has been a shift of interest towards the field of
biometric authentication, which proves the identity of the user using their
biological characteristics. We explore a novel biometric based on the
electrical activity of the human heart in the form of electrocardiogram (ECG)
signals. In order to explore the stability of ECG as a biometric, we collect
data from 55 participants over two sessions with a period of 4 months in
between. We also use a consumer-grade ECG monitor that is more affordable and
usable than a medical-grade counterpart. Using a standard approach to evaluate
our classifier, we obtain error rates of 2.4% for data collected within one
session and 9.7% for data collected across two sessions. The experimental
results suggest that ECG signals collected using a consumer-grade monitor can
be successfully used for user authentication.

Comment: Appears in the "Who Are You?! Adventures in Authentication" workshop
(WAY 2019) co-located with the Symposium on Usable Privacy and Security
(SOUPS 2019).
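As an illustration of one standard way to evaluate a biometric classifier (the abstract does not say which error measure it reports; the equal error rate below is an assumption, and all scores are synthetic), a minimal sketch in Python:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Return the EER: the error rate at the threshold where the
    false accept rate (impostors accepted) and the false reject rate
    (genuine users rejected) are closest to equal."""
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor >= t)  # impostors scoring above threshold
        frr = np.mean(genuine < t)    # genuine users scoring below it
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Hypothetical similarity scores: higher means "more likely same user".
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.10, 500)   # same-user comparisons
impostor = rng.normal(0.4, 0.15, 500)  # different-user comparisons
print(f"EER: {equal_error_rate(genuine, impostor):.1%}")
```

A within-session evaluation would draw both score sets from one recording session, while a cross-session evaluation enrolls on session one and tests on session two, which is why the cross-session error rate is typically higher.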